Higgs-- Coupling at High and Low Energy Colliders
There is no tree-level flavor changing neutral current (FCNC) in the standard
model (SM) which contains only one Higgs doublet. If more Higgs doublets are
introduced for various reasons, tree-level FCNC would be inevitable unless an
extra symmetry were imposed. FCNC processes are therefore excellent probes of
physics beyond the SM (BSM). In this paper, we study the lepton flavor
violating (LFV) decay processes and
induced by the Higgs-- vertex. For
, its branching ratio is also related to the
, and vertices. We categorize the BSM
into two scenarios, with the Higgs coupling strengths either close to or far
from their SM values. For the latter scenario, we take the spontaneously
broken two-Higgs-doublet model (Lee model) as an example. We consider the
constraints from recent LHC and B-factory data, and find that these
measurements give only weak constraints. At LHC Run II, will be confirmed,
or a stricter limit will be set on its branching ratio. Accordingly,
for generally chosen parameters. For the positive case,
can be discovered with
pair samples at the SuperB factory, the Super tau-charm factory, and a new Z-factory.
The future measurements for and
will be used to distinguish these two scenarios or set strict constraints on
the correlations among different Higgs couplings; see Table II in the text
for details.
Comment: 18 pages, 10 figures, 2 tables; more references added; more
discussions about cancellation in the amplitude added according to the
referee's suggestion
Testing the homogeneity of the Universe using gamma-ray bursts
In this paper, we study the homogeneity of the GRB distribution using a
subsample of the Greiner GRB catalogue, which contains 314 objects with
redshift (244 of them discovered by the Swift GRB Mission). We try to
resolve the tension between the new observations and the current theory of
structure formation and growth. To test the results against possible biases
in redshift determination and the incompleteness of the Greiner sample, we also
apply our analysis to the 244 GRBs discovered by Swift and the subsample
presented by the Swift Gamma-Ray Burst Host Galaxy Legacy Survey (SHOALS). The
real space two-point correlation function (2PCF) of GRBs, is
calculated using a Landy-Szalay estimator. We perform a standard least-
fit to the measured 2PCFs of GRBs. We use the best-fit 2PCF to deduce a
recently defined homogeneity scale. The homogeneity scale is defined as
the comoving radius of the sphere inside which the number of GRBs N(<r) is
proportional to r^3 within 1%, or equivalently above which the correlation
dimension D_2 of the sample is within 1% of 3. For the Swift
subsample of 244 GRBs, the correlation length and slope are Mpc and (at confidence level).
The corresponding scale for a homogeneous distribution of GRBs is Mpc. The results help to alleviate the tension between the new
discovery of the excess clustering of GRBs and the cosmological principle of
large-scale homogeneity. It implies that very massive structures in the
relatively local Universe do not necessarily violate the cosmological principle
and could conceivably be present.
Comment: 7 pages, 5 figures, accepted by Astronomy & Astrophysics. The data
used in this work (e.g. Tables 1 and 2) are publicly available online in
electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr
(130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A
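The Landy-Szalay estimator used above combines normalized data-data, data-random, and random-random pair counts as ξ(r) = (DD - 2DR + RR)/RR. A minimal sketch, with toy uniform points standing in for the GRB positions and all names hypothetical:

```python
import numpy as np

def pair_counts(A, B, edges):
    """Histogram of pairwise separations between two 3-D point sets."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    if A is B:
        d = d[np.triu_indices(len(A), k=1)]  # count each unordered pair once
    else:
        d = d.ravel()
    return np.histogram(d, bins=edges)[0].astype(float)

rng = np.random.default_rng(1)
data = rng.random((200, 3))    # stand-in for GRB comoving positions
rand = rng.random((1000, 3))   # random catalogue in the same volume
edges = np.linspace(0.05, 0.5, 10)

# Pair counts, normalized by the number of pairs in each case.
DD = pair_counts(data, data, edges) / (len(data) * (len(data) - 1) / 2)
RR = pair_counts(rand, rand, edges) / (len(rand) * (len(rand) - 1) / 2)
DR = pair_counts(data, rand, edges) / (len(data) * len(rand))

xi = (DD - 2 * DR + RR) / RR   # Landy-Szalay estimate of the 2PCF
```

For uniform (homogeneous) toy data, ξ(r) fluctuates around zero; clustering in real data shows up as ξ(r) > 0 on small scales.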
Human Capital and Hotel Operating Performance
Human capital plays an essential role in firm success in the hospitality industry (Baum, 2015; Tracey, 2014); however, the mechanism through which human capital contributes to a hotel's performance remains unclear (Bagri et al., 2010; Domínguez-Falcón et al., 2016; Ooi et al., 2015). By extending Hua et al. (2015) and O'Neill et al. (2008), this study systematically examined the impacts of human capital, proxied by Total Labor Expenses at different lagged time points, on hotel operating performance, while controlling for a comprehensive array of potential confounding variables. This study offers a more holistic view of whether human capital influences hotel operating performance, and if so, how. It further helps explain the mixed results from prior research. The employment of the fixed-effects model framework also enables control for fixed-effects variables such as chain scale and location.
A Theoretical Framework of the Impact of Price Transparency on Pricing in the Lodging Industry
Facilitated by electronic marketplaces, price transparency has gained momentum in influencing hotel pricing and has recently aroused heightened stakeholder interest in the lodging industry. However, a theoretical framework explaining the impact of price transparency on pricing is lacking, and widely differing opinions appear to confuse industry practitioners. Therefore, this study is designed to reveal a theoretical framework of the impact of price transparency on pricing in the lodging industry and to offer relevant managerial implications.
Space-efficient data sketching algorithms for network applications
Sketching techniques are widely adopted in network applications. Sketching algorithms “encode” data into succinct data structures that can later be accessed and “decoded” for various purposes, such as network measurement, accounting, and anomaly detection. Bloom filters and counter braids are two well-known representatives of this category. These sketching algorithms usually need to strike a tradeoff between performance (how much information can be revealed, and how fast) and cost (storage, transmission, and computation). This dissertation is dedicated to the research and development of several sketching techniques, including improved forms of stateful Bloom filters, statistical counter arrays, and error estimating codes. A Bloom filter is a space-efficient randomized data structure for approximately representing a set in order to support membership queries. Bloom filters and their variants have found widespread use in many networking applications, where it is important to minimize the cost of storing and communicating network data. In this thesis, we propose a family of Bloom filter variants augmented by a rank-indexing method. We show that such augmentation brings a significant reduction in space and in the number of memory accesses, especially when deletions of set elements from the Bloom filter need to be supported. An exact active counter array is another important building block in many sketching algorithms, where the storage cost of the array is of paramount concern. Previous approaches reduce the storage costs while either losing accuracy or supporting only passive measurements. In this thesis, we propose an exact statistics counter array architecture that supports active measurements (real-time reads and writes). It also leverages the aforementioned rank-indexing method and exploits statistical multiplexing to minimize the storage costs of the counter array. Error estimating coding (EEC) has recently been established as an important tool for estimating bit error rates in the transmission of packets over wireless links. In essence, the EEC problem is also a sketching problem, since the EEC code can be viewed as a sketch of the packet sent, which is decoded by the receiver to estimate the bit error rate. In this thesis, we first investigate the asymptotic bound of error estimating coding by viewing the problem from a two-party computation perspective, and then investigate its coding/decoding efficiency using Fisher information analysis. Further, we develop several sketching techniques, including the enhanced tug-of-war (EToW) sketch and the generalized EEC (gEEC) sketch family, which can achieve around a 70% reduction in sketch size with similar estimation accuracy. For all the solutions proposed above, we use theoretical tools such as information theory and communication complexity to investigate how far our proposed solutions are from the theoretical optimum. We show that the proposed techniques are asymptotically or empirically very close to the theoretical bounds.
Ph.D.
Committee Chair: Xu, Jun; Committee Member: Feamster, Nick; Committee Member: Li, Baochun; Committee Member: Romberg, Justin; Committee Member: Zegura, Ellen W.
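A plain Bloom filter of the kind the abstract builds on can be sketched in a few lines. This toy version uses salted SHA-1 digests as a stand-in for the k independent hash functions; the rank-indexing augmentation proposed in the thesis is not shown:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: an m-bit array set by k salted hashes."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m)  # one byte per bit, for clarity

    def _positions(self, item):
        # Derive k bit positions from salted SHA-1 digests of the item.
        for salt in range(self.k):
            h = hashlib.sha1(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        # No false negatives; false positives occur with small probability.
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
for flow in ("10.0.0.1:443", "10.0.0.2:80"):  # hypothetical flow keys
    bf.add(flow)
```

Membership queries on inserted items always return True; queries on other items return False except with a false-positive probability that shrinks as m grows relative to the number of inserted elements.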
Efficient Optimization of Performance Measures by Classifier Adaptation
In practical applications, machine learning algorithms are often needed to
learn classifiers that optimize domain-specific performance measures.
Previous research has focused on learning the needed classifier in isolation,
yet learning nonlinear classifiers for nonlinear and nonsmooth performance
measures remains hard. In this paper, rather than learning the needed
classifier by optimizing the specific performance measure directly, we
circumvent this problem by proposing a novel two-step approach called CAPO:
first train nonlinear auxiliary classifiers with existing learning methods,
and then adapt the auxiliary classifiers to the specific performance measure.
In the first step, the auxiliary classifiers can be obtained efficiently with
off-the-shelf learning algorithms. For the second step, we show that the
classifier adaptation problem can be reduced to a quadratic programming
problem, which is similar to linear SVMperf and can be solved efficiently. By
exploiting nonlinear auxiliary classifiers, CAPO can generate nonlinear
classifiers that optimize a large variety of performance measures, including
all performance measures based on the contingency table as well as AUC, while
keeping high computational efficiency. Empirical studies show that CAPO is
effective and computationally efficient, and it is even more efficient than
linear SVMperf.
Comment: 30 pages, 5 figures, to appear in IEEE Transactions on Pattern
Analysis and Machine Intelligence, 201
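The two-step idea can be illustrated with a much-simplified stand-in: train any auxiliary scorer, then adapt it to the target measure. Here the paper's QP-based adaptation is replaced by a one-parameter bias search maximizing F1 (a contingency-table measure), and the toy data and nearest-centroid scorer are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n0, n1 = 270, 30                      # imbalanced toy data
X = np.vstack([rng.normal(0.0, 1.0, (n0, 2)),
               rng.normal(1.5, 1.0, (n1, 2))])
y = np.array([0] * n0 + [1] * n1)

# Step 1 (stand-in): an "auxiliary classifier" score; CAPO would use any
# off-the-shelf nonlinear learner here. This one scores by nearest centroid.
c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
score = np.linalg.norm(X - c0, axis=1) - np.linalg.norm(X - c1, axis=1)

def f1(y_true, y_pred):
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))
    fp = int(np.sum((y_pred == 1) & (y_true == 0)))
    fn = int(np.sum((y_pred == 0) & (y_true == 1)))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

# Step 2 (stand-in): adapt the classifier to F1 by tuning a bias b, i.e.
# predict 1 when score + b > 0 (the paper solves a QP over the auxiliary
# classifiers' outputs instead of this grid search).
grid = np.linspace(-3, 3, 121)
best_b = max(grid, key=lambda b: f1(y, (score + b > 0).astype(int)))
default_f1 = f1(y, (score > 0).astype(int))
adapted_f1 = f1(y, (score + best_b > 0).astype(int))
```

Because the grid includes b = 0, the adapted F1 can never be worse than the unadapted score; the point of the sketch is only that the target measure is optimized in a cheap second stage on top of a fixed auxiliary model.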